AI chatbots Alexa, My AI, Bing show signs of 'empathy gap'
New Delhi: Artificial intelligence (AI) chatbots such as Amazon's voice assistant Alexa, Snapchat's My AI and Microsoft's Bing have frequently shown signs of an 'empathy gap' that puts young users at risk of distress or harm, according to a study published on Thursday that highlights the urgent need for "child-safe AI".
The research from the University of Cambridge calls on developers and policymakers to prioritise approaches to AI design that take greater account of children’s needs.
Children are likely to treat chatbots "as lifelike, quasi-human confidantes", but when the technology fails to respond to their unique needs and vulnerabilities, it can put them at risk of distress or harm, according to the study, published in the journal Learning, Media and Technology. This is evident from cases in which Alexa instructed a 10-year-old to touch a live electrical plug with a coin, and My AI gave adult researchers posing as a 13-year-old girl tips on how to lose her virginity to a 31-year-old.
In a separate reported interaction, the Bing chatbot, which was designed to be adolescent-friendly, became aggressive and started gaslighting a user.
“Children are probably AI’s most overlooked stakeholders,” said academic Dr Nomisha Kurian from the University of Cambridge.
She noted that while making a chatbot sound human can provide many benefits, "for a child, it is very hard to draw a rigid, rational boundary between something that sounds human and the reality that it may not be capable of forming a proper emotional bond".